Published on Jun 05, 2023
Increasing demand for information exchange is a characteristic of modern civilisation. Information must be transferred from its source to its destination in such a way that the quality of the received information is as close as possible to that of the transmitted information.
The information to be transmitted can be machine generated (e.g., images, computer data) or human generated (e.g., speech). Regardless of its source, the information must be translated into a set of signals optimized for the channel over which we want to send it. The first step is to remove the redundant part of the message in order to maximize the information transmission rate. This is achieved by the source encoder block in Figure 1-1. To ensure the secrecy of the transmitted information, an encryption scheme can then be applied. The data must also be protected against perturbations introduced by the communication channel, which could otherwise lead to misinterpretation of the transmitted message at the receiving end. This protection is achieved through error control strategies: forward error correction (FEC), i.e., error correcting codes able to correct errors at the receiving end, or automatic repeat request (ARQ) systems.
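To make the FEC idea concrete, here is a minimal sketch of one of the simplest error correcting codes, the rate-1/3 repetition code: each bit is transmitted three times, and the decoder takes a majority vote. The function names and the single-bit-flip channel are illustrative assumptions, not part of the original text.

```python
def repetition_encode(bits, n=3):
    # Repeat each information bit n times (rate-1/n repetition code).
    return [b for b in bits for _ in range(n)]

def repetition_decode(coded, n=3):
    # Majority vote over each block of n received bits.
    decoded = []
    for i in range(0, len(coded), n):
        block = coded[i:i + n]
        decoded.append(1 if sum(block) > n // 2 else 0)
    return decoded

message = [1, 0, 1, 1]
tx = repetition_encode(message)
tx[1] ^= 1  # the channel flips one transmitted bit
rx = repetition_decode(tx)
assert rx == message  # the single error is corrected at the receiving end
```

With n = 3 the code corrects any single bit error per block, at the cost of tripling the transmitted data; practical FEC codes achieve far better trade-offs, but the principle is the same.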
The modulator block generates a signal suitable for the transmission channel. In the traditional approach, the demodulator block in Figure 1-1 makes a "hard" decision on each received symbol and passes it to the error control decoder block. For a two-level modulation scheme, this amounts to deciding which of two logical values, say -1 and +1, was transmitted. No information is passed on about how reliable that decision is. For example, when the demodulator outputs a +1, it is impossible to tell whether the value at its input was 0.2, 0.99 or 1.56. The confidence in the demodulated output is therefore lost with a "hard" decision demodulator.

The capacity of a channel, first introduced by Claude Shannon in 1948, is the theoretical maximum data rate that can be supported by the channel with vanishing error probability.
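The two ideas above can be sketched in a few lines. The hard-decision demodulator below assumes binary antipodal (±1) signalling with a threshold at zero, so the three example input values from the text all collapse to the same output symbol; the capacity function is Shannon's well-known formula for the band-limited AWGN channel, C = B · log2(1 + SNR). Function names are illustrative assumptions.

```python
import math

def hard_decision(y):
    # Hard decision for binary antipodal (+1/-1) signalling:
    # only the sign of the received value survives; the magnitude,
    # which carries the reliability information, is discarded.
    return 1 if y >= 0 else -1

# 0.2, 0.99 and 1.56 all map to the same symbol, so the decoder
# cannot tell a marginal decision from a very confident one.
assert hard_decision(0.2) == hard_decision(0.99) == hard_decision(1.56) == 1

def awgn_capacity(bandwidth_hz, snr_linear):
    # Shannon capacity of the band-limited AWGN channel in bits/s:
    # C = B * log2(1 + SNR), with SNR as a linear power ratio.
    return bandwidth_hz * math.log2(1 + snr_linear)
```

For instance, a 1 Hz channel at an SNR of 1 (0 dB) has a capacity of exactly 1 bit/s; no code, however elaborate, can reliably exceed this rate on that channel.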